Kernel-Free Quadratic Surface Support Vector Regression with Non-Negative Constraints
Authors
Abstract
In this paper, a kernel-free quadratic surface support vector regression with non-negative constraints (NQSSVR) is proposed for the regression problem. The task of NQSSVR is to find a quadratic function as the regression function. By utilizing the kernel-free technique, the model avoids the difficulty of choosing a kernel function and its corresponding parameters, and has interpretability to a certain extent. In fact, the data may carry the prior information that the value of the response variable increases as the explanatory variable grows in a certain interval. Moreover, in order to ensure that the regression function is monotonically increasing on this interval, non-negative constraints with respect to the coefficients of the quadratic function are introduced to construct the optimization problem of NQSSVR. The regression function obtained by NQSSVR matches this prior information, which has been proven by theoretical analysis. In addition, the existence and uniqueness of the solution to the primal and dual problems of NQSSVR, and the relationship between them, are addressed. Experimental results on two artificial datasets and seven benchmark datasets validate the feasibility and effectiveness of our approach. Finally, the method is verified on real examples concerning air quality.
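The core idea — fitting a quadratic regression function whose coefficients are constrained to be non-negative so that the fit is monotonically increasing on an interval — can be sketched with a simplified least-squares surrogate. This is not the paper's ε-insensitive SVR formulation; the data, tolerances, and one-dimensional setting below are illustrative assumptions, using SciPy's bounded least-squares solver:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Illustrative 1-D data with a monotone increasing trend on [0, 2].
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * x**2 + 3.0 * x + 1.0 + 0.05 * rng.standard_normal(x.size)

# Design matrix for the quadratic regression function f(x) = a*x^2 + b*x + c.
Phi = np.column_stack([x**2, x, np.ones_like(x)])

# Non-negative constraints a >= 0 and b >= 0 guarantee f'(x) = 2ax + b >= 0,
# i.e. f is monotonically increasing on x >= 0; the intercept c is free.
res = lsq_linear(Phi, y, bounds=([0, 0, -np.inf], [np.inf, np.inf, np.inf]))
a, b, c = res.x
print(f"a={a:.3f}, b={b:.3f}, c={c:.3f}")
```

Replacing the squared loss here with the ε-insensitive loss and a regularization term would bring the sketch closer to the NQSSVR model described in the abstract; the monotonicity argument via non-negative coefficients is the same.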
Similar resources
Support vector regression with random output variable and probabilistic constraints
Support Vector Regression (SVR) solves regression problems based on the concept of the Support Vector Machine (SVM). In this paper, a new model of SVR with probabilistic constraints is proposed, in which the output data and the bias are considered random variables with uniform probability functions. Using the proposed method, the optimal regression hyperplane can be obtained by solving a quadrati...
Kernel Support Vector Regression with imprecise output
We consider a regression problem where uncertainty affects the dependent variable of the elements of the database. A model based on the standard ε-Support Vector Regression approach is given, where two hyperplanes need to be constructed to predict the interval-valued dependent variable. By using the Hausdorff distance to measure the error between predicted and real intervals, a convex quadrat...
Multiple Kernel Learning for Support Vector Regression
Kernel support vector (SV) regression has successfully been used for prediction of nonlinear and complicated data. However, like other kernel methods such as support vector machine (SVM) classification, the quality of SV regression depends on proper choice of kernel functions and their parameters. Kernel selection for model selection is conventionally performed through repeated cross validation...
Support Vector Regression with a Generalized Quadratic Loss
The standard SVR formulation for real-valued function approximation on multidimensional spaces is based on the ε-insensitive loss function, where errors are considered uncorrelated. Because of this, local information in the feature space that could be useful for improving the prediction model is disregarded. In this paper we address this problem by defining a generalized quadratic loss where the co-oc...
Journal
Journal title: Entropy
Year: 2023
ISSN: 1099-4300
DOI: https://doi.org/10.3390/e25071030